
llama : move random seed generation to the samplers #9398

Merged 5 commits into master on Sep 10, 2024
Conversation

slaren (Collaborator) commented Sep 10, 2024

Initializing a sampler with LLAMA_DEFAULT_SEED causes it to generate its own seeds. When the sampler is reset, a new seed is generated.

Note that LLAMA_DEFAULT_SEED being 0xFFFFFFFF results in a rather unreadable number when printed in decimal (4294967295). It might be easier to understand if LLAMA_DEFAULT_SEED were changed to zero.

This should fix the server always generating the same response.
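The behavior described above can be sketched as follows. `LLAMA_DEFAULT_SEED` matches the sentinel value mentioned in this PR (0xFFFFFFFF); the `sampler_sketch` struct and its members are hypothetical illustrations of the idea, not the actual llama.cpp sampler API.

```cpp
#include <cstdint>
#include <random>

// Sentinel value discussed in this PR.
constexpr uint32_t LLAMA_DEFAULT_SEED = 0xFFFFFFFF;

// Hypothetical sketch of a sampler that owns its seed.
struct sampler_sketch {
    uint32_t     seed_req; // seed requested at initialization
    uint32_t     seed_cur; // seed actually in use
    std::mt19937 rng;

    explicit sampler_sketch(uint32_t seed) : seed_req(seed) {
        reset();
    }

    // On reset, a sampler initialized with LLAMA_DEFAULT_SEED draws
    // a fresh random seed; otherwise it reuses the requested seed.
    void reset() {
        seed_cur = (seed_req == LLAMA_DEFAULT_SEED)
                       ? std::random_device{}()
                       : seed_req;
        rng.seed(seed_cur);
    }
};
```

With a fixed seed, every reset reproduces the same sequence; with the default seed, each reset starts from a newly generated seed, which the server can then report back to the client (see the `seed_cur` field discussed below).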

@slaren slaren added the breaking change Changes that break ABIs, APIs, file formats, or other forms of backwards compatibility. label Sep 10, 2024
Comment on lines 1269 to 1270
// replace seed instead?
{"seed_cur", slot.smpl ? gpt_sampler_get_seed(slot.smpl) : 0},
Owner replied:
It's ok as proposed.

@slaren slaren merged commit 49006c6 into master Sep 10, 2024
52 checks passed
@slaren slaren deleted the sl/sampling-fixes branch September 10, 2024 16:04
dsx1986 pushed a commit to dsx1986/llama.cpp that referenced this pull request Oct 29, 2024
* llama_sampler_penalties : clamp penalty_last_n to zero
Labels
breaking change Changes that break ABIs, APIs, file formats, or other forms of backwards compatibility. examples server
2 participants